Block Regularized Lasso for Multivariate Multi-Response Linear Regression
Authors
Abstract
The multivariate multi-response (MVMR) linear regression problem is investigated, in which the design matrices are Gaussian with covariance matrices Σ^{(1:K)} = (Σ^{(1)}, …, Σ^{(K)}) for the K linear regressions. The support union of the K p-dimensional regression vectors (collected as columns of a matrix B^*) is recovered using the ℓ1/ℓ2-regularized Lasso. Sufficient and necessary conditions that guarantee successful recovery of the support union are characterized via a threshold. More specifically, it is shown that, under certain conditions on the distributions of the design matrices, if n > c_{p1} ψ(B^*, Σ^{(1:K)}) log(p − s), where c_{p1} is a constant and s is the size of the support set, then the ℓ1/ℓ2-regularized Lasso correctly recovers the support union; and if n < c_{p2} ψ(B^*, Σ^{(1:K)}) log(p − s), where c_{p2} is a constant, then the ℓ1/ℓ2-regularized Lasso fails to recover the support union. In particular, ψ(B^*, Σ^{(1:K)}) captures the impact of the sparsity of the K regression vectors and of the statistical properties of the design matrices on the threshold for support recovery. Numerical results demonstrate the advantages of joint support-union recovery using the multi-task Lasso over individual support recovery using the single-task Lasso.
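To make the comparison in the last sentence concrete, here is a minimal sketch of support-union recovery with a multi-task (ℓ1/ℓ2-regularized) Lasso versus per-task single-task Lassos. It assumes scikit-learn, whose MultiTaskLasso implements an ℓ1/ℓ2 block penalty but requires one shared design matrix, whereas the paper allows a separate Gaussian design per task; the problem sizes and regularization level are illustrative choices, not the paper's experiments.

```python
# Minimal sketch: joint support-union recovery with a multi-task (l1/l2)
# Lasso versus per-task single-task Lassos. scikit-learn's MultiTaskLasso
# uses one shared design matrix for all tasks, while the paper allows a
# separate Gaussian design per task; sizes and alpha are illustrative.
import numpy as np
from sklearn.linear_model import Lasso, MultiTaskLasso

rng = np.random.default_rng(0)
K, p, s, n = 4, 200, 10, 120              # tasks, dimension, support size, samples

support = rng.choice(p, size=s, replace=False)
B_true = np.zeros((p, K))
B_true[support] = rng.normal(size=(s, K)) # K regression vectors with a shared support

X = rng.normal(size=(n, p))               # Gaussian design (identity covariance)
Y = X @ B_true + 0.1 * rng.normal(size=(n, K))

# Multi-task Lasso: a single l1/l2 block penalty couples all K responses.
mt = MultiTaskLasso(alpha=0.1).fit(X, Y)
mt_support = np.where(np.linalg.norm(mt.coef_, axis=0) > 1e-6)[0]

# Single-task Lasso: recover each column's support separately, then take the union.
st_support = set()
for k in range(K):
    st = Lasso(alpha=0.1).fit(X, Y[:, k])
    st_support |= set(np.where(np.abs(st.coef_) > 1e-6)[0])

print("true support:      ", sorted(support))
print("multi-task union:  ", sorted(mt_support))
print("single-task union: ", sorted(st_support))
```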
Similar resources
Sharp Threshold for Multivariate Multi-Response Linear Regression via Block Regularized Lasso
for K linear regressions. The support union of K p-dimensional regression vectors (collected as columns of matrix B^*) is recovered using ℓ1/ℓ2-regularized Lasso. Sufficient and necessary conditions on sample complexity are characterized as a sharp threshold to guarantee successful recovery of the support union. This model has been previously studied via ℓ1/ℓ∞-regularized Lasso by Negahban & Wain...
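For reference, a standard way to write the block-regularized program behind both of these results is sketched below (notation taken from the abstract; the exact normalization of the loss and penalty may differ between the papers):

```latex
% l1/l2-regularized (multi-task) Lasso: one squared loss per task,
% coupled through a block norm over the rows of B.
\hat{B} \in \arg\min_{B \in \mathbb{R}^{p \times K}}
  \frac{1}{2n} \sum_{k=1}^{K} \bigl\| y^{(k)} - X^{(k)} \beta^{(k)} \bigr\|_2^2
  \;+\; \lambda_n \sum_{j=1}^{p} \bigl\| B_{j\cdot} \bigr\|_2
```

Here β^{(k)} denotes the k-th column of B and B_{j·} its j-th row; replacing ‖B_{j·}‖2 with ‖B_{j·}‖∞ in the penalty yields the ℓ1/ℓ∞ variant mentioned at the end of the snippet.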
On the Q-linear Convergence of a Majorized Proximal ADMM for Convex Composite Programming and Its Applications to Regularized Logistic Regression
This paper aims to study the convergence rate of a majorized alternating direction method of multipliers with indefinite proximal terms (iPADMM) for solving linearly constrained convex composite optimization problems. We establish the Q-linear rate convergence theorem for the 2-block majorized iPADMM under mild conditions. Based on this result, the convergence rate analysis of symmetric Gaussian-Sei...
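As a point of reference for the scheme being analyzed, below is a minimal sketch of the textbook 2-block ADMM applied to a Lasso problem. It illustrates only the 2-block splitting structure; it is not the paper's majorized/indefinite-proximal variant (iPADMM), and the step size rho and iteration count are arbitrary illustrative choices.

```python
# Minimal sketch: the textbook 2-block ADMM applied to a Lasso problem,
#   min_x 0.5*||Ax - b||^2 + lam*||z||_1   subject to  x - z = 0.
# This is NOT the paper's majorized iPADMM; it only illustrates the
# 2-block splitting structure that the convergence analysis is about.
import numpy as np

def lasso_admm(A, b, lam, rho=1.0, iters=300):
    p = A.shape[1]
    z = np.zeros(p)
    u = np.zeros(p)                                  # scaled dual variable
    M = np.linalg.inv(A.T @ A + rho * np.eye(p))     # cached x-update factor
    Atb = A.T @ b
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))                # block 1: quadratic step
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # block 2: soft-threshold
        u += x - z                                   # dual update
    return z

rng = np.random.default_rng(1)
A = rng.normal(size=(100, 30))
x_true = np.zeros(30)
x_true[:5] = 1.0
b = A @ x_true + 0.05 * rng.normal(size=100)
print(np.round(lasso_admm(A, b, lam=5.0), 2)[:10])
```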
Covariance-regularized regression and classification for high-dimensional problems.
In recent years, many methods have been developed for regression in high-dimensional settings. We propose covariance-regularized regression, a family of methods that use a shrunken estimate of the inverse covariance matrix of the features in order to achieve superior prediction. An estimate of the inverse covariance matrix is obtained by maximizing its log likelihood, under a multivariate norma...
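A minimal sketch of the recipe this snippet describes, assuming the simplest possible regularization of the covariance (a ridge term added before inversion); with this particular choice the estimator coincides with ridge regression, whereas the paper's family also admits sparser inverse-covariance estimates, so this illustrates the general idea rather than the paper's exact estimator.

```python
# Minimal sketch of covariance-regularized regression: shrink the sample
# covariance of the features toward the identity, then plug its inverse
# into the least-squares formula beta = S^{-1} X^T y / n. With this ridge
# shrinkage the result equals ridge regression; it is not the paper's
# exact estimator.
import numpy as np

def cov_regularized_fit(X, y, gamma=1.0):
    n, p = X.shape
    S = X.T @ X / n                        # sample covariance (features assumed centered)
    return np.linalg.solve(S + gamma * np.eye(p), X.T @ y / n)

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 100))             # high-dimensional: p > n
beta = np.zeros(100)
beta[:3] = 2.0
y = X @ beta + 0.1 * rng.normal(size=50)
print(np.round(cov_regularized_fit(X, y, gamma=0.5)[:6], 2))
```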
Performance Analysis Of Regularized Linear Regression Models For Oxazolines And Oxazoles Derivitive Descriptor Dataset
Regularized regression techniques for linear regression have been developed over the past few decades to reduce the flaws of ordinary least squares regression with regard to prediction accuracy. In this paper, new methods for using regularized regression in model choice are introduced, and we distinguish the conditions in which regularized regression improves our ability to discriminate models. We a...
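A minimal sketch of the kind of held-out prediction comparison the snippet describes, using scikit-learn and synthetic data in place of the paper's descriptor dataset; the regularization levels are arbitrary illustrative choices.

```python
# Minimal sketch: comparing OLS against ridge and lasso on held-out data,
# the prediction-accuracy comparison the snippet describes. Synthetic
# data stands in for the paper's descriptor dataset.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 60))
beta = np.zeros(60)
beta[:8] = rng.normal(size=8)
y = X @ beta + 0.5 * rng.normal(size=120)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, model in [("OLS", LinearRegression()),
                    ("Ridge", Ridge(alpha=1.0)),
                    ("Lasso", Lasso(alpha=0.05))]:
    model.fit(X_tr, y_tr)
    print(f"{name:6s} test MSE: {mean_squared_error(y_te, model.predict(X_te)):.3f}")
```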
Estimation and Selection via Absolute Penalized Convex Minimization And Its Multistage Adaptive Applications
The ℓ1-penalized method, or the Lasso, has emerged as an important tool for the analysis of large data sets. Many important results have been obtained for the Lasso in linear regression which have led to a deeper understanding of high-dimensional statistical problems. In this article, we consider a class of weighted ℓ1-penalized estimators for convex loss functions of a general form, including ...
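A minimal two-stage sketch in the spirit of the weighted ℓ1 penalization this snippet describes: a first-stage Lasso produces coefficient-based weights, and the weighted problem is solved by rescaling the design (an adaptive-Lasso-style step). The paper's framework covers general convex losses; squared loss and scikit-learn's Lasso are used here only for concreteness.

```python
# Minimal sketch of a multistage weighted-l1 scheme: fit a first-stage
# Lasso, turn its coefficients into weights, then refit. Penalizing
# |w_j * b_j| uniformly is equivalent to an ordinary Lasso on the
# column-rescaled design X / w, with the fit undone afterwards.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
X = rng.normal(size=(150, 80))
beta = np.zeros(80)
beta[:5] = 3.0
y = X @ beta + 0.3 * rng.normal(size=150)

stage1 = Lasso(alpha=0.2).fit(X, y)
w = 1.0 / (np.abs(stage1.coef_) + 1e-3)   # small epsilon keeps weights finite

stage2 = Lasso(alpha=0.2).fit(X / w, y)   # Lasso on the rescaled design
beta_hat = stage2.coef_ / w               # undo the rescaling
print("stage-2 nonzeros:", np.where(np.abs(beta_hat) > 1e-6)[0])
```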
Publication date: 2013